A device reads people's thoughts in real time, but only if they imagine the password "chittychittybangbang"

A team of scientists has managed to read the thoughts of four people with severe paralysis in real time. The device, implanted in the brain, is capable of capturing imagined phrases without the participants having to physically attempt to speak, as was the case in most previous similar projects. The researchers, from Stanford University (USA), acknowledge their concern about "mental privacy" and the possibility of "accidental leakage of internal thoughts." To protect each user's inner world, the authors made the brain reader activate only when the user imagines a complex password, one unlikely to come up in everyday thought: "chittychittybangbang," after the famous 1960s children's book and film about the inventor of a flying car.
“This is the first time that complete sentences of inner speech have been decoded in real time from a large vocabulary of possible words,” Stanford neuroscientist Benyamin Abramovich tells EL PAÍS. The researcher recalls that, a year ago, a team from the California Institute of Technology managed to read this inner speech in two people with tetraplegia, using microelectrodes implanted under the crown of the head, but that experiment covered only eight words. Abramovich and his colleagues claim that their device can detect 125,000 imagined words. Their results were published this Thursday in the journal Cell.
The Stanford group implanted microelectrodes in the motor cortex of three people with amyotrophic lateral sclerosis and a woman with tetraplegia and speech difficulties following a stroke. The authors report that they were able to read their "internal monologues" with 74% accuracy, without requiring the participants to make the laborious effort of trying to speak. An artificial intelligence program facilitated the interpretation of the brain signals.
“Our results wouldn't have been possible with non-invasive technologies. It would be like trying to record a conversation between two people inside a football stadium during a match. A microphone placed right next to them could perfectly isolate their voices. A microphone outside the stadium might be able to tell you when a goal is scored, but it's impossible to determine the content of a person's conversation,” argues Abramovich, the study's lead author along with electrical engineer Erin Kunz.
A headband that can read thoughts without surgery is still a long way off. “The neural interface used in our study can record the activity of individual neurons in the brain, similar to the microphone next to someone's mouth inside a stadium. Noninvasive brain-recording technologies are like the microphone outside the stadium: they can pick up signals related to important events, but not detailed information like internal speech,” Abramovich adds.
Spanish neuroscientist Rafael Yuste visited the White House in Washington at the end of 2021, invited by the United States National Security Council, to warn of the imminent arrival of a world in which people will connect to the internet directly with their brains, using headbands or caps capable of reading thoughts. In this hypothetical future, artificial intelligence could autocomplete a person's thoughts, just as it already autocompletes text in word processors. Companies such as Apple, Meta (formerly Facebook), and Neuralink (owned by magnate Elon Musk) have patented or are developing wearable devices of this type, according to Yuste, director of the Center for Neurotechnology at Columbia University in New York.
The Spanish researcher recalls that two years ago, another scientific team managed to read 78 words per minute in the brain of Ann, a woman who had lost her speech almost two decades earlier due to a stroke. That group, led by neurosurgeon Edward Chang at the University of California, San Francisco, achieved 75% accuracy with implanted electrodes, but Ann had to physically attempt to speak. A normal conversation in English is about 150 words per minute.
Yuste downplays the distinction between attempted speech and interior monologue. “I think it's essentially a semantic difference, because it hasn't been proven that neurons distinguish between the two cases, and there's a lot of evidence that when you think about a movement, motor neurons are activated, even if you don't actually execute it,” says the neuroscientist, who is leading an international campaign to have authorities legally protect citizens' mental privacy. In his opinion, the Stanford and San Francisco experiments “demonstrate that language can be decoded using implantable neurotechnology.”
“It is urgent to protect neurological rights and legislate for the protection of neural data,” says Yuste, president of the Neurorights Foundation, dedicated to raising awareness about the ethical implications of neurotechnology. His awareness-raising work led Chile to become the first country to take a step toward protecting brain information in its Constitution in 2021. The foundation has also promoted similar legislation in the Brazilian state of Rio Grande do Sul and in Colorado, California, Montana, and Connecticut in the United States. In Spain, Cantabria is promoting the first European law to protect brain data.
Yuste applauds the Stanford team for establishing the "chittychittybangbang" password to activate their mind reader. "Including a phrase as an internal password that prevents decoding is novel and can protect mental privacy," he says. Neurosurgeon Frank Willett, co-director of the Stanford lab, said in a statement that the "chittychittybangbang" password was "extremely effective in preventing the internal monologue, when thought without the intention of sharing it, from being leaked."
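The gating idea described above can be pictured as a simple state machine: the decoder runs continuously, but nothing it decodes leaves the device until the password phrase itself is detected in the stream. The sketch below is purely illustrative, assuming a stream of already-decoded words; the study's actual system is a neural network operating on microelectrode recordings, and its real gating logic is not described in this article.

```python
# Hypothetical sketch of a password-gated decoder, NOT the
# Stanford team's implementation. We assume the decoding stage
# already produced a stream of candidate words.

PASSWORD = "chittychittybangbang"

class GatedDecoder:
    """Suppresses all output until the unlock phrase is imagined."""

    def __init__(self, password: str):
        self.password = password
        self.unlocked = False

    def process(self, decoded_word: str):
        # While locked, compare each decoded token against the
        # password; nothing is emitted, so stray inner speech
        # never leaves the device.
        if not self.unlocked:
            if decoded_word == self.password:
                self.unlocked = True
            return None
        # Once unlocked, decoded words pass through.
        return decoded_word

# Simulated stream of words decoded from inner speech.
stream = ["hello", "chittychittybangbang", "i", "want", "water"]
gate = GatedDecoder(PASSWORD)
output = [w for w in (gate.process(t) for t in stream) if w is not None]
print(output)  # ['i', 'want', 'water']
```

Note that in this toy version the words preceding the password ("hello") are silently discarded, which mirrors the privacy goal: an unintended inner monologue produces no output at all.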
EL PAÍS